Is the USA today right- or left-leaning?
Could you elaborate on the political leanings of the USA today? Are there signs of a shift toward a more left- or right-wing ideology? What factors are driving any such changes, and how does the current landscape compare with previous political eras in the country? Finally, what are the implications of these leanings for various aspects of American society, including economic policy, social welfare, and foreign relations?